
# Text continuation

**Gpt2 Small Cs** (fav-kky)
A small version of GPT-2 pre-trained on 115 GB of cleaned Czech text, suitable for Czech text generation tasks.
Tags: Large Language Model, Transformers, Other · Downloads: 135 · Likes: 2
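
For the GPT-2 models in this listing, text continuation follows the standard Hugging Face transformers pattern sketched below. The hub ID `fav-kky/gpt2-small-cs` is an assumption inferred from the author and model name above; verify the actual ID on the hub.

```python
# Minimal text-continuation sketch using the transformers pipeline API.
# Assumption: the model is published on the hub as "fav-kky/gpt2-small-cs".
from transformers import pipeline

generator = pipeline("text-generation", model="fav-kky/gpt2-small-cs")

# Continue a Czech prompt; max_new_tokens bounds the continuation length.
result = generator("Dnes je krásný den a", max_new_tokens=40, do_sample=True)
print(result[0]["generated_text"])
```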

**Bloom 1b4 Zh** (Langboat) · License: OpenRAIL
A Chinese language model built on the bigscience/bloom-1b7 architecture and reduced to 1.4B parameters through vocabulary compression, which lowers GPU memory usage.
Tags: Large Language Model, Transformers, Chinese · Downloads: 5,157 · Likes: 18
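
The vocabulary-compression claim can be inspected directly from the model configs: the embedding table holds vocab_size × hidden_size parameters, so shrinking the vocabulary is what brings the model down to roughly 1.4B parameters and cuts GPU memory. A minimal sketch, assuming the hub IDs `bigscience/bloom-1b7` and `Langboat/bloom-1b4-zh`:

```python
# Compare vocabulary sizes of the base and vocabulary-pruned Bloom configs.
# Assumption: hub IDs "bigscience/bloom-1b7" and "Langboat/bloom-1b4-zh".
from transformers import AutoConfig

base = AutoConfig.from_pretrained("bigscience/bloom-1b7")
pruned = AutoConfig.from_pretrained("Langboat/bloom-1b4-zh")

# Embedding parameters scale as vocab_size * hidden_size, so a smaller
# vocabulary directly shrinks the model and its GPU memory footprint.
print("base vocab_size:  ", base.vocab_size)
print("pruned vocab_size:", pruned.vocab_size)
```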

**Spanish Gpt2** (mrm8488) · License: MIT
A Spanish GPT-2 model trained from scratch on the large_spanish_corpus (BETO corpus) using the Flax framework, developed with support from the HuggingFace community week event.
Tags: Large Language Model, Spanish · Downloads: 971 · Likes: 19

**Spanish T5 Small** (flax-community) · License: MIT
A small Spanish T5 model trained on the BETO corpus using the Flax framework, supported by the HuggingFace community week event.
Tags: Large Language Model, Spanish · Downloads: 692 · Likes: 11
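
Unlike the GPT-2 entries above, T5 is an encoder-decoder model, so it runs through the text2text-generation pipeline rather than text-generation. A minimal sketch, assuming the hub ID `flax-community/spanish-t5-small`; note that a T5 pretrained only on span corruption usually needs task-specific fine-tuning before its outputs are useful.

```python
# Seq2seq generation sketch for a T5 model via transformers.
# Assumption: the model is published as "flax-community/spanish-t5-small".
from transformers import pipeline

t5 = pipeline("text2text-generation", model="flax-community/spanish-t5-small")

# T5 maps an input text to an output text; without fine-tuning, the output of
# a purely pretrained checkpoint is generally not task-useful.
out = t5("El modelo fue entrenado con el corpus BETO.", max_new_tokens=30)
print(out[0]["generated_text"])
```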

**Gpt2 Small Turkish** (gorkemgoknar) · License: Apache-2.0
A fine-tuned version of the GPT2-Small English model, trained on Turkish Wikipedia articles and suitable for Turkish text generation tasks.
Tags: Large Language Model, Other · Downloads: 545 · Likes: 10

**Gpt2 Fa** (HooshvareLab) · License: Apache-2.0
ParsGPT2 is a Persian version of GPT-2, developed by the Hooshvare team for Persian text generation tasks.
Tags: Large Language Model, Other · Downloads: 5,996 · Likes: 17